

Cerebras Systems' massive chips are revolutionizing AI

#artificialintelligence

Bigger isn't always better, but sometimes it is. Cerebras Systems, a company bent on accelerating machine learning systems, built the world's largest chip last year. In the time since, it has developed bespoke solutions to some of the largest problems in the AI industry. Founded in 2015, Cerebras is a sort of reunion tour for most of its C-suite executives. Prior to building chips the size of dinner plates, the team was responsible for SeaMicro, a company founded in 2007 that eventually sold to AMD for more than $330 million in 2012.


National Lab Taps AI Machine With Massive Chip to Fight Coronavirus

#artificialintelligence

The system's calling card is a massive chip, measuring 8.5 inches by 8.5 inches. Putting the neural network on the chip, instead of dispersing it across a system, enables problems to be solved faster. Data travels a shorter distance, speeding up the processing of information. "You've got a chip that's about 60 times bigger than any existing chip. That raw capability is what we're trying to exploit," said Rick Stevens, associate laboratory director for computing, environment and life sciences at Argonne. When the AI computer was installed in November, originally for cancer research, the lab determined its computing power was almost equivalent to that of a cluster of computers with up to 300 graphics chips, which are widely used for AI.



The Cerebras CS-1 computes deep learning AI problems by being bigger, bigger, and bigger than any other chip – TechCrunch

#artificialintelligence

Deep learning is all the rage these days in enterprise circles, and it isn't hard to understand why. Whether it is optimizing ad spend, finding new drugs to cure cancer, or just offering better, more intelligent products to customers, machine learning -- and particularly deep learning models -- has the potential to massively improve a range of products and applications. The key word, though, is 'potential.' While we have heard oodles of words sprayed across enterprise conferences over the last few years about deep learning, there remain huge roadblocks to making these techniques widely available. Deep learning models are highly networked, with dense graphs of nodes that don't "fit" well with the traditional ways computers process information.